Trajectory versus probability density entropy


Similar articles

Trajectory versus probability density entropy.

We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the ...
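For orientation, the two entropies contrasted in this abstract are usually written as follows; these are standard textbook forms, not quoted from the paper itself. The density entropy is the Gibbs functional of the evolving distribution, while the Kolmogorov-Sinai entropy is built from trajectories through partitions of phase space.

```latex
% Density (Gibbs) entropy of a distribution p(x,t) evolving under a
% transport equation:
S(t) = -\int p(x,t)\,\ln p(x,t)\,\mathrm{d}x .

% Kolmogorov--Sinai (trajectory) entropy of a map T: supremum over finite
% partitions \mathcal{P} of the growth rate of the entropy of the
% dynamically refined partitions,
h_{\mathrm{KS}} = \sup_{\mathcal{P}} \lim_{n \to \infty} \frac{1}{n}\,
H\!\Bigl( \bigvee_{k=0}^{n-1} T^{-k} \mathcal{P} \Bigr).
```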


Trajectory versus Probability Density Entropy

We study the problem of entropy increase of the Bernoulli-shift map without recourse to the concept of trajectory, and we discuss whether, and under which conditions, the distribution density entropy coincides with the Kolmogorov-Sinai entropy, namely, with the trajectory entropy. PACS numbers: 05.45.+b, 03.65.Sq, 05.20.-y
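The Bernoulli-shift (dyadic) map x ↦ 2x mod 1 has KS entropy ln 2, so the coarse-grained density entropy of an initially localized ensemble is expected to grow by about ln 2 per iteration until it saturates at the resolution of the coarse graining. A quick numerical check might look like the sketch below; the ensemble size, initial spread, and bin count are arbitrary choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of points started in one narrow cell of the unit interval
x = rng.uniform(0.0, 1.0 / 1024.0, size=1_000_000)
n_bins = 1024

def coarse_entropy(points):
    # Shannon entropy (in nats) of the coarse-grained density (histogram)
    counts, _ = np.histogram(points, bins=n_bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

for step in range(12):
    print(f"step {step:2d}  entropy = {coarse_entropy(x):.3f}")
    x = (2.0 * x) % 1.0   # Bernoulli-shift (dyadic) map

# The entropy grows by roughly ln 2 ~ 0.693 per step until it saturates
# at the resolution limit ln(n_bins) ~ 6.93.
```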


Trajectory probability hypothesis density filter

This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, referred to as the trajectory PHD (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. Like the PHD filter, the TPHD filter is based on recursively obtaining the ...
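The trajectory variant introduced in the paper is not reproduced here; as background on the recursion it builds on, the following is a minimal grid-based sketch of the standard PHD filter in one dimension. All motion, detection, clutter, and birth parameters, and the measurement values, are made-up assumptions for illustration only.

```python
import numpy as np

# Grid-based sketch of the standard PHD (intensity) recursion in 1-D.
x = np.linspace(0.0, 100.0, 501)
dx = x[1] - x[0]

p_survive, p_detect = 0.99, 0.9
clutter_density = 2.0 / 100.0                 # uniform clutter intensity kappa(z)
birth = np.full_like(x, 0.1 / 100.0)          # diffuse birth intensity b(x)

def gauss(z, m, s):
    return np.exp(-0.5 * ((z - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def predict(v, motion_sigma=1.0):
    # v_pred(x) = b(x) + p_S * integral f(x | x') v(x') dx'  (random-walk motion)
    moved = np.array([np.sum(gauss(xi, x, motion_sigma) * v) * dx for xi in x])
    return birth + p_survive * moved

def update(v, measurements, meas_sigma=1.0):
    # v_post(x) = (1 - p_D) v(x)
    #           + sum_z p_D g(z|x) v(x) / (kappa + int p_D g(z|x') v(x') dx')
    out = (1.0 - p_detect) * v
    for z in measurements:
        num = p_detect * gauss(z, x, meas_sigma) * v
        out = out + num / (clutter_density + np.sum(num) * dx)
    return out

v = np.zeros_like(x)                          # intensity function over the grid
for scan, z_set in enumerate([[20.0, 70.0], [21.0, 69.5, 43.0], [22.2, 68.8]]):
    v = update(predict(v), z_set)
    print(f"scan {scan}: expected number of targets = {np.sum(v) * dx:.2f}")
```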


Exact Probability Distribution versus Entropy

The problem addressed concerns the determination of the average number of successive attempts needed to guess a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is to guess words in decreasing order of probability. When word and alphabet sizes are large, a...
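As a self-contained illustration of the quantities being compared (not the paper's first- or second-order language models), the expected number of guesses under the optimal strategy of guessing in decreasing order of probability can be computed directly and set against the entropy of the distribution. The toy distribution below is made up; entropy alone does not determine the expected guessing effort.

```python
import numpy as np

# Hypothetical word distribution (any normalised probability vector works)
p = np.array([0.35, 0.20, 0.15, 0.10, 0.08, 0.05, 0.04, 0.03])
assert abs(p.sum() - 1.0) < 1e-12

# Optimal guessing order: decreasing probability
p_sorted = np.sort(p)[::-1]
ranks = np.arange(1, len(p) + 1)

expected_guesses = np.sum(ranks * p_sorted)    # E[G], average number of attempts
entropy_bits = -np.sum(p * np.log2(p))         # H(X) in bits

print(f"expected number of guesses: {expected_guesses:.3f}")
print(f"entropy: {entropy_bits:.3f} bits, 2^H = {2 ** entropy_bits:.3f}")
```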


Probability Density Estimation Using Entropy Maximization

We propose a method for estimating probability density functions and conditional density functions by training on data produced by such distributions. The algorithm employs new stochastic variables that amount to coding of the input, using a principle of entropy maximization. It is shown to be closely related to the maximum likelihood approach. The encoding step of the algorithm provides an est...
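The snippet does not spell out the paper's coding-based algorithm; as generic background on the entropy-maximization principle it invokes, here is a minimal moment-constrained maximum-entropy density fit on a grid. The grid, the target moments, and the optimizer choice are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.optimize import minimize

# Discretised support for the estimated density (illustrative choice)
x = np.linspace(-5.0, 5.0, 401)
dx = x[1] - x[0]

# Constraints: match given expectations of feature functions (made-up numbers)
features = np.vstack([x, x**2])      # f_1(x) = x, f_2(x) = x^2
targets = np.array([0.5, 1.5])       # required E[f_1], E[f_2]

def dual(lam):
    # Convex dual of max-entropy subject to E[f] = targets: log Z(lam) - lam . targets
    logits = lam @ features
    m = logits.max()
    log_z = m + np.log(np.sum(np.exp(logits - m)) * dx)
    return log_z - lam @ targets

lam = minimize(dual, x0=np.zeros(2), method="BFGS").x

p = np.exp(lam @ features)
p /= p.sum() * dx                     # normalised exponential-family density

print("fitted moments:", np.round(features @ p * dx, 3))   # ~ [0.5, 1.5]
```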



Journal

Journal title: Physical Review E

Year: 2001

ISSN: 1063-651X, 1095-3787

DOI: 10.1103/physreve.64.016223